📚 node [[recurrent_neural_network|recurrent neural network]]
⥅ related node [[recurrent neural networks in nlp]]
⥅ related node [[rnns recurrent neural networks]]
⥅ related node [[recurrent_neural_network]]
⥅ node [[recurrent-neural-networks-in-nlp]] pulled by Agora

Recurrent Neural Networks in NLP

Go to [[Week 2 - Introduction]] or back to the [[Main AI Page]]. Part of the page on [[Artificial Intelligence/Introduction to AI/Week 2 - Introduction/Natural Language Processing]]. See [[RNNS - Recurrent neural networks]] for more detail.

As words move through the network one at a time, each new input is processed together with the hidden state produced by the words before it, so that early-stage words form part of the processing of late-stage words.

This allows context to be taken into account when processing language.
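
In code, the core of this recurrence is a single step that mixes the current word with the hidden state carried over from earlier words. The sketch below is a minimal illustration (the function name, weights, and dimensions are made-up assumptions, not from any particular library):

```python
import numpy as np

def rnn_step(x_t, h_prev, W_xh, W_hh, b_h):
    """One recurrence step: the new hidden state depends on the
    current input x_t and on the previous hidden state h_prev."""
    return np.tanh(x_t @ W_xh + h_prev @ W_hh + b_h)

rng = np.random.default_rng(0)
W_xh = rng.normal(size=(8, 16)) * 0.1   # input-to-hidden weights (8-dim word vectors)
W_hh = rng.normal(size=(16, 16)) * 0.1  # hidden-to-hidden weights (16 hidden units)
b_h = np.zeros(16)

h = np.zeros(16)                          # context starts out empty
for word_vec in rng.normal(size=(5, 8)):  # five toy "word vectors"
    h = rnn_step(word_vec, h, W_xh, W_hh, b_h)  # context carried forward
```

Because h is fed back in at every step, information from the first word can still influence the processing of the last one.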

Traditional RNNs are trained using backpropagation through time (BPTT).

Here's an example of what BPTT looks like.
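
The sketch below is one minimal way to write it in NumPy (the sizes, toy data, and squared-error loss are illustrative assumptions): the forward pass unrolls the network over the whole sequence, and the backward pass walks those same steps in reverse so that gradients flow from later words back to earlier ones.

```python
import numpy as np

rng = np.random.default_rng(0)
T, D, H = 4, 8, 16                      # sequence length, input dim, hidden dim
xs = rng.normal(size=(T, D))            # toy input sequence
target = rng.normal(size=H)             # toy target for the final hidden state

W_xh = rng.normal(size=(D, H)) * 0.1
W_hh = rng.normal(size=(H, H)) * 0.1
b_h = np.zeros(H)

# Forward pass: unroll the network over all T steps, caching hidden states.
hs = [np.zeros(H)]
for t in range(T):
    hs.append(np.tanh(xs[t] @ W_xh + hs[-1] @ W_hh + b_h))

loss = 0.5 * np.sum((hs[-1] - target) ** 2)

# Backward pass (BPTT): walk the unrolled steps in reverse, so the
# gradient at each step accumulates contributions from all later steps.
dW_xh, dW_hh, db_h = np.zeros_like(W_xh), np.zeros_like(W_hh), np.zeros_like(b_h)
dh = hs[-1] - target                    # dL/dh at the final step
for t in reversed(range(T)):
    dz = dh * (1.0 - hs[t + 1] ** 2)    # back through tanh: h = tanh(z)
    dW_xh += np.outer(xs[t], dz)
    dW_hh += np.outer(hs[t], dz)
    db_h += dz
    dh = dz @ W_hh.T                    # pass the gradient to the previous step

for W, dW in ((W_xh, dW_xh), (W_hh, dW_hh), (b_h, db_h)):
    W -= 0.01 * dW                      # one gradient-descent update
```

The reverse loop is also where vanishing and exploding gradients show up: dh is multiplied by W_hh once per step, so its magnitude can shrink or grow exponentially with sequence length.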

⥅ node [[recurrent_neural_network]] pulled by Agora

recurrent neural network

Go back to the [[AI Glossary]]

#seq

A neural network that is intentionally run multiple times, where parts of each run feed into the next run. Specifically, hidden layers from the previous run provide part of the input to the same hidden layer in the next run. Recurrent neural networks are particularly useful for evaluating sequences, so that the hidden layers can learn from previous runs of the neural network on earlier parts of the sequence.

For example, the following figure shows a recurrent neural network that runs four times. Notice that the values learned in the hidden layers from the first run become part of the input to the same hidden layers in the second run. Similarly, the values learned in the hidden layer on the second run become part of the input to the same hidden layer in the third run. In this way, the recurrent neural network gradually trains and predicts the meaning of the entire sequence rather than just the meaning of individual words.

(Figure: an RNN that runs four times to process four input words.)
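
To make the four runs concrete, here is a small sketch of the loop the figure describes (the words, vectors, and weights are invented for illustration): the same weights are applied on every run, and the hidden values learned on one run become part of the input to the next.

```python
import numpy as np

rng = np.random.default_rng(1)
words = ["the", "cat", "sat", "down"]           # four input words
embed = {w: rng.normal(size=8) for w in words}  # toy word vectors

W_xh = rng.normal(size=(8, 16)) * 0.1
W_hh = rng.normal(size=(16, 16)) * 0.1

h = np.zeros(16)  # hidden layer before the first run
for run, word in enumerate(words, start=1):
    # The hidden values from the previous run are part of this run's input.
    h = np.tanh(embed[word] @ W_xh + h @ W_hh)
    print(f"run {run}: processed {word!r}, hidden state norm = {np.linalg.norm(h):.3f}")
```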
